Globally convergent conjugate gradient algorithms without the Lipschitz condition for nonconvex optimization

Authors

Abstract

It is well known that, under the Wolfe–Powell inexact line search, global convergence of nonlinear conjugate gradient methods for nonconvex functions always requires the Lipschitz continuity condition. In this paper, we show that this condition is unnecessary for proving global convergence of a particular algorithm whenever its search direction possesses the well-known sufficient descent property and the trust region feature. Thus, global convergence of the family proposed by Yuan et al. (Numer. Algorithms, 84 (2020)) is established without the Lipschitz condition, since its search directions have these two properties. Furthermore, a new algorithm with a modified line search technique is presented and analysed under suitable assumptions. Numerical results show that its performance is competitive on some test problems.
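As a concrete illustration of the two properties the abstract relies on, the sketch below builds a three-term PRP-type search direction whose construction forces the sufficient descent property (g_k^T d_k = -||g_k||^2) and a trust-region bound (||d_k|| <= (1 + 2/c)||g_k||). This is a representative construction under assumed formulas, not the exact family of Yuan et al.; the function name cg_direction and the safeguard constant c are illustrative choices.

```python
# A minimal sketch (not the authors' exact formulas) of a three-term
# PRP-type conjugate gradient direction that enjoys both properties
# named in the abstract.
import numpy as np

def cg_direction(g, g_prev, d_prev, c=1.0):
    """Search direction d_k from the current/previous gradients and the
    previous direction; c > 0 controls the trust-region constant."""
    y = g - g_prev                                   # y_{k-1} = g_k - g_{k-1}
    # Safeguarded denominator: keeps both |beta| and |theta| bounded,
    # which yields ||d_k|| <= (1 + 2/c) * ||g_k||.
    denom = max(c * np.linalg.norm(d_prev) * np.linalg.norm(y),
                float(np.dot(g_prev, g_prev)))
    beta = np.dot(g, y) / denom
    theta = np.dot(g, d_prev) / denom
    # The beta and theta terms cancel in g_k^T d_k, giving
    # g_k^T d_k = -||g_k||^2 by construction.
    return -g + beta * d_prev - theta * y

# Numerical check of both properties on random data.
rng = np.random.default_rng(0)
g, g_prev, d_prev = rng.standard_normal((3, 5))
d = cg_direction(g, g_prev, d_prev)
assert np.isclose(np.dot(g, d), -np.dot(g, g))                 # sufficient descent
assert np.linalg.norm(d) <= 3.0 * np.linalg.norm(g) + 1e-12    # trust region, c = 1
```

With these two properties available by construction, a convergence argument can bound the directions directly by the gradient norm, which is the leverage the paper uses to dispense with the Lipschitz condition.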


Similar articles

Globally convergent modified Perry's conjugate gradient method

Conjugate gradient methods are probably the most famous iterative methods for solving large scale optimization problems in scientific and engineering computation, characterized by the simplicity of their iteration and their low memory requirements. In this paper, we propose a new conjugate gradient method which is based on the MBFGS secant condition by modifying Perry’s method. Our proposed met...

An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...

Globally Convergent Cutting Plane Method for Nonconvex Nonsmooth Minimization

Nowadays, solving nonsmooth (not necessarily differentiable) optimization problems plays a very important role in many areas of industrial applications. Most of the algorithms developed so far deal only with nonsmooth convex functions. In this paper, we propose a new algorithm for solving nonsmooth optimization problems that are not assumed to be convex. The algorithm combines the traditional c...

A Globally Convergent Conjugate Gradient Method for Minimizing Self-Concordant Functions on Riemannian Manifolds

Self-concordant functions are a special class of convex functions in Euclidean space introduced by Nesterov. They are used in interior point methods, based on Newton iterations, where they play an important role in solving efficiently certain constrained optimization problems. The concept of self-concordant functions has been defined on Riemannian manifolds by Jiang et al. and a damped Newton m...

A new hybrid conjugate gradient algorithm for unconstrained optimization

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The new method generates sufficient descent directions independently of any line search. Moreover, global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
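Since both this method and the main paper establish convergence under the Wolfe line search, the hypothetical helper below spells out the two conditions that search enforces; satisfies_wolfe, the constants c1 and c2, and the sample quadratic are illustrative assumptions, not code from either paper.

```python
# Hypothetical illustration of the (weak) Wolfe conditions referenced
# above; not code from the cited papers.
import numpy as np

def satisfies_wolfe(f, grad_f, x, d, alpha, c1=1e-4, c2=0.9):
    """True if step length alpha along direction d meets both the
    sufficient-decrease (Armijo) and curvature conditions, with the
    usual convention 0 < c1 < c2 < 1."""
    g_d = np.dot(grad_f(x), d)                       # directional derivative
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g_d
    curvature = np.dot(grad_f(x + alpha * d), d) >= c2 * g_d
    return armijo and curvature

# Example on the quadratic f(x) = ||x||^2 with a steepest-descent step.
def f(x):
    return float(np.dot(x, x))

def grad_f(x):
    return 2.0 * x

x0 = np.array([1.0, -2.0])
print(satisfies_wolfe(f, grad_f, x0, -grad_f(x0), alpha=0.25))  # True
```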


Journal

Journal title: Journal of Industrial and Management Optimization

Year: 2023

ISSN: 1547-5816, 1553-166X

DOI: https://doi.org/10.3934/jimo.2022257